An iterative coordinate descent algorithm to compute sparse low-rank approximations
Authors

Abstract
In this paper, we describe a new algorithm to build a few sparse principal components from a given data matrix. Our approach does not explicitly form the covariance matrix of the data and can be viewed as an extension of the Kogbetliantz approximate singular value decomposition to sparse components. We demonstrate the performance of the proposed algorithm on various datasets from the literature and use it to perform dimensionality reduction in classification applications.
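To make the idea concrete, the following is a minimal sketch of computing one k-sparse principal component by truncated power iteration operating on the data matrix directly, so the covariance matrix is never formed. This is a hypothetical illustration of the general technique, not the authors' coordinate descent algorithm; the function name and parameters are assumptions.

```python
import numpy as np

def sparse_pc(X, k, n_iter=100, seed=0):
    """One k-sparse principal component of the n x d data matrix X,
    via truncated power iteration (illustrative stand-in for the
    paper's coordinate descent scheme)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        # Apply the covariance implicitly: X.T @ (X @ v) costs O(nd)
        # and never materializes the d x d matrix X.T @ X.
        w = X.T @ (X @ v)
        # Sparsity step: zero out all but the k largest-magnitude entries.
        small = np.argpartition(np.abs(w), d - k)[: d - k]
        w[small] = 0.0
        v = w / np.linalg.norm(w)
    return v
```

Further components could be extracted by deflating X against each recovered direction before repeating the iteration.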
Similar references
Sparse PCA through Low-rank Approximations
We introduce a novel algorithm that computes the k-sparse principal component of a positive semidefinite matrix A. Our algorithm is combinatorial and operates by examining a discrete set of special vectors lying in a low-dimensional eigen-subspace of A. We obtain provable approximation guarantees that depend on the spectral profile of the matrix: the faster the eigenvalue decay, the better the ...
Block Low-Rank (BLR) approximations to improve multifrontal sparse solvers
Matrices coming from elliptic Partial Differential Equations (PDEs) have been shown to have a low-rank property: well-defined off-diagonal blocks of their Schur complements can be approximated by low-rank products. In the multifrontal context, this can be exploited within the fronts in order to obtain a substantial reduction of the memory requirement and an efficient way to perform many of the b...
The Mixing method: coordinate descent for low-rank semidefinite programming
In this paper, we propose a coordinate descent approach to low-rank structured semidefinite programming. The approach, which we call the Mixing method, is extremely simple to implement, has no free parameters, and typically attains an order of magnitude or better improvement in optimization performance over the current state of the art. We show that for certain problems, the method is strictly ...
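The Mixing method update described above can be sketched as follows: each column of the low-rank factor is replaced in turn by the exact minimizer of the objective over the unit sphere. This is a hedged reconstruction from the abstract; the rank heuristic and parameter names are assumptions, not the authors' reference implementation.

```python
import numpy as np

def mixing_method(C, r=None, n_iter=200, seed=0):
    """Coordinate descent for the low-rank SDP factorization
    min <C, V.T @ V> subject to each column v_i of V having unit norm.
    Each inner step sets v_i to its exact coordinate-wise minimizer."""
    n = C.shape[0]
    # Assumed rank heuristic: r ~ sqrt(2n) suffices for such SDPs.
    r = r or int(np.ceil(np.sqrt(2 * n))) + 1
    rng = np.random.default_rng(seed)
    V = rng.standard_normal((r, n))
    V /= np.linalg.norm(V, axis=0)
    for _ in range(n_iter):
        for i in range(n):
            # Gradient w.r.t. v_i, excluding the constant diagonal term.
            g = V @ C[:, i] - C[i, i] * V[:, i]
            norm = np.linalg.norm(g)
            if norm > 0:
                # Exact minimizer of 2 * <v_i, g> over the unit sphere.
                V[:, i] = -g / norm
    return V
```

Because every column update is an exact minimization, the objective is non-increasing, which is consistent with the parameter-free behavior the abstract emphasizes.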
Non-homogeneous updates for the iterative coordinate descent algorithm
Statistical reconstruction methods show great promise for improving resolution, and reducing noise and artifacts in helical X-ray CT. In fact, statistical reconstruction seems to be particularly valuable in maintaining reconstructed image quality when the dosage is low and the noise is therefore high. However, high computational cost and long reconstruction times remain as a barrier to the use ...
Sparse Random Feature Algorithm as Coordinate Descent in Hilbert Space
In this paper, we propose a Sparse Random Features algorithm, which learns a sparse non-linear predictor by minimizing an l1-regularized objective function over the Hilbert Space induced from a kernel function. By interpreting the algorithm as Randomized Coordinate Descent in an infinite-dimensional space, we show the proposed approach converges to a solution within ε-precision of that using an...
Journal

Journal title: IEEE Signal Processing Letters
Year: 2021
ISSN: 1558-2361, 1070-9908
DOI: https://doi.org/10.1109/lsp.2021.3132276